47 research outputs found

    Force feedback facilitates multisensory integration during robotic tool use

    The present study investigated the effects of force feedback during tool use on the multisensory integration of visuo-tactile information. Participants learned to control a robotic tool through a surgical robotic interface. Following tool-use training, participants performed a crossmodal congruency task, responding to tactile vibrations applied to their hands while ignoring visual distractors superimposed on the robotic tools. The first experiment found that tool-use training with force feedback facilitates multisensory integration of signals from the tool, reflected in a stronger crossmodal congruency effect after force-feedback training than after training without force feedback or no training. The second experiment extends these findings by showing that training with realistic online force feedback produced a stronger crossmodal congruency effect than training in which force feedback was delayed. The present study highlights the importance of haptic information for multisensory integration and extends findings from classical tool-use studies to the domain of robotic tools. We argue that such crossmodal congruency effects provide an objective measure of robotic tool integration, and we propose potential applications in surgical robotics, robotic tools, and human-tool interaction.

    Neural signatures of visuo-motor integration during human-robot interactions

    Visuo-motor integration shapes our daily experience and underpins the sense of feeling in control over our actions. The last decade has seen a surge in robotically and virtually mediated interactions, whereby bodily actions ultimately result in an artificial movement. Yet despite the growing number of applications, data on the neurophysiological correlates of visuo-motor processing during human-machine interactions under dynamic conditions remain scarce. Here we address this issue by employing a bimanual robotic interface able to track voluntary hand movements, rendered in real time into the motion of two virtual hands. We experimentally manipulated the visual feedback in the virtual reality with spatial and temporal conflicts and investigated their impact on (1) visuo-motor integration and (2) the subjective experience of being the author of one's actions (i.e., the sense of agency). Using somatosensory evoked responses measured with electroencephalography, we investigated the neural differences occurring when the integration between motor commands and visual feedback is disrupted. Our results show that the right posterior parietal cortex encodes differences between congruent and spatially incongruent interactions. The experimental manipulations also induced a decrease in the sense of agency over the robotically mediated actions. These findings offer solid neurophysiological grounds that can be used in the future to monitor integration mechanisms during movements and ultimately to enhance subjective experience during human-machine interactions.

    Voluntary self-touch increases body ownership

    Experimental manipulations of body ownership have indicated that multisensory integration is central to forming bodily self-representation. Voluntary self-touch is a unique multisensory situation involving corresponding motor, tactile, and proprioceptive signals. Yet, even though self-touch is frequent in everyday life, its contribution to the formation of body ownership is not well understood. Here we investigated the role of voluntary self-touch in body ownership using a novel adaptation of the rubber hand illusion (RHI), in which a robotic system and virtual reality allowed participants to administer self-touch to real and virtual hands. In the first experiment, active and passive self-touch were applied in the absence of visual feedback. In the second experiment, we tested the role of visual feedback in this bodily illusion. Finally, in the third experiment, we compared active and passive self-touch to the classical RHI, in which the touch is administered by the experimenter. We hypothesized that active self-touch would increase ownership over the virtual hand through the addition of motor signals strengthening the bodily illusion. The results indicated that active self-touch elicited stronger illusory ownership than passive self-touch and sensory-only stimulation, showing an important role for active self-touch in the formation of the bodily self.

    Numerical Modelling of Human Organs for the Design of Robotic Systems

    This thesis developed the basis for a design tool for robotic platforms interacting with human organs. The stomach, treated in this work, represents only an example of a broader approach. The static behaviour was fully characterized using a non-linear FEM model of the stomach. The model was built on data available in the literature, concerning in-vivo and ex-corpus measurements on the organs of the abdominal cavity under laparoscopic conditions. The data were used to extrapolate a strain-energy function and thus to simulate the stomach as a hyperelastic material. The simulations were performed in ANSYS. Images taken from EPFL's Visible Human Server were processed in SolidWorks and then used to build the stomach geometry. The model was preliminarily validated with an experimental test. It allows a force-displacement curve of the stomach to be extrapolated in order to simulate the interaction between the organ and the robotic system. The designer can assess the forces and torques acting on the robotic system by using the force-displacement curve to implement a non-linear spring in the x, y, and z directions, and then running the simulation in CosmosMotion. A first approach to the dynamic interaction was also tackled: we proposed adding a dashpot to the non-linear spring, thereby modelling the robotic system-organ interaction as a mass-spring-dashpot system. An initial way to characterize the dashpot, relating the damping factor to the stress-relaxation behaviour of the organ, was shown. A proper analysis of the system should, however, be performed by FE transient analysis verified by experimental tests. Finally, two different mass-spring-dashpot configurations for modelling the interaction in CosmosMotion were presented.
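    The proposed mass-spring-dashpot interaction model can be sketched numerically. The cubic stiffening law and all parameter values below are illustrative assumptions, not data from the thesis; in practice the spring term would come from the extrapolated force-displacement curve.

```python
# Minimal sketch of the organ-robot contact as a mass-spring-dashpot system:
#   m * x'' = f_ext + f_spring(x) - c * x'
# The stiffening law k0*x + k2*x^3 and all constants are hypothetical.

def spring_force(x, k0=50.0, k2=2.0e4):
    """Non-linear restoring force: stiffens with displacement,
    mimicking hyperelastic tissue behaviour (illustrative only)."""
    return -(k0 * x + k2 * x**3)

def simulate(f_ext, m=0.2, c=5.0, dt=1e-4, t_end=2.0):
    """Explicit-Euler integration; returns displacement at t_end."""
    x, v = 0.0, 0.0
    for _ in range(int(t_end / dt)):
        a = (f_ext + spring_force(x) - c * v) / m
        v += a * dt
        x += v * dt
    return x

# Under a constant 1 N push the system settles where the spring
# force balances the external force.
x_eq = simulate(1.0)
```

    A designer could swap `spring_force` for an interpolation of the measured force-displacement curve and fit `c` from the organ's stress-relaxation response.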

    System and methods for using thereof for predicting hallucinations

    A system for predicting the likelihood of an occurrence of hallucinations in a subject, including: a master device configured to be at least one of moved, moved on, and manipulated by a subject; a slave device operably connected with the master device and adapted so that the subject is directly or indirectly touched by the slave device in accordance with the master device's movement; and a computer device operably connected to both the master and the slave device, the computer device configured to modulate at least one of a time, space, and force activation of the slave device in response to an activation of the master device, record data regarding a difference in at least one of time, space, and force activation, compare the recorded data with reference data, and show the result of the comparison graphically or numerically on a display device, as an indicator of the likelihood of the occurrence of hallucinations in the subject.
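    The comparison step of the claim can be sketched as follows. The reference delay value and the fraction-based scoring rule are illustrative assumptions; the patent does not specify them.

```python
# Hypothetical sketch of the "compare recorded data with reference data"
# step, using only a temporal master-slave mismatch (in ms). The 500 ms
# reference and the scoring rule are assumptions for illustration.

def likelihood_indicator(recorded_delays, reference_delay=500.0):
    """Fraction of recorded trials whose master-slave delay meets or
    exceeds the reference delay; shown on the display device as the
    indicator of hallucination likelihood."""
    hits = sum(1 for d in recorded_delays if d >= reference_delay)
    return hits / len(recorded_delays)

# Usage: delays (ms) recorded while the subject operates the master device.
trials = [0.0, 500.0, 500.0, 0.0, 500.0]
indicator = likelihood_indicator(trials)
```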

    Multisensory haptic system and method

    A computer-implemented method for operating a haptic device, the haptic device comprising a plurality of tactile displays configured to provide haptic stimuli to a user, the method including the steps of (a) processing an audio signal derived from an audio file, thereby obtaining at least one profile of frequencies and amplitudes of the audio signal, (b) converting the frequency and amplitude profiles into a haptic profile, and (c) operating the haptic device according to the haptic profile.
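    Steps (a)-(c) can be sketched with a synthetic signal in place of a real audio file. The window size, the linear amplitude-to-intensity mapping, and the four-display layout are assumptions, not details from the patent.

```python
# Hypothetical sketch of the claimed pipeline: (a) extract a short-time
# amplitude profile, (b) map it to a haptic intensity profile, (c) fan
# the profile out to a plurality of tactile displays.

import math

def amplitude_profile(samples, window=256):
    """(a) Short-time RMS amplitude per window of the audio signal."""
    return [
        math.sqrt(sum(s * s for s in samples[i:i + window]) / window)
        for i in range(0, len(samples) - window + 1, window)
    ]

def to_haptic_profile(amps, max_amp=1.0):
    """(b) Map amplitudes to normalized vibration intensities in [0, 1]."""
    return [min(a / max_amp, 1.0) for a in amps]

def drive_displays(haptic, n_displays=4):
    """(c) Duplicate the haptic profile across the tactile displays."""
    return {d: haptic for d in range(n_displays)}

# Usage: a 1 kHz tone sampled at 16 kHz for 0.1 s.
sig = [0.5 * math.sin(2 * math.pi * 1000 * t / 16000) for t in range(1600)]
profile = to_haptic_profile(amplitude_profile(sig))
commands = drive_displays(profile)
```

    A full implementation would also derive a frequency profile (e.g. via an FFT per window) so that different frequency bands can drive different displays; the sketch keeps only the amplitude path.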

    Device, System, and Method for Robot-Controlled Induction of the Feeling of Human Presence

    A method for inducing the feeling of a presence (FoP) in a subject by using a master-slave robotic system, the method including the steps of altering a visual perception of a surrounding environment of the subject, connecting the subject with a robotic master device so that the subject can move, move on, or manipulate the robotic master device, connecting the subject with a robotic slave device, and making the subject move, move on, or manipulate the robotic master device so that the subject is directly or indirectly touched by the robotic slave device according to a movement of the robotic master device, wherein the robotic master device and the robotic slave device are operatively connected so that the subject receives at least one of spatially and temporally conflicting sensorimotor stimulation.

    Multimodal Haptic Device, System, and Method of Using the Same

    A multimodal haptic device operating as a closed-loop system, the device including a pipeline configured to allow a closed-loop flow of a fluid medium, a manifold operatively connected to the pipeline, the manifold having a pump and a valve to control and regulate a flow of the fluid medium along the pipeline, and a display unit operatively connected to the pipeline, the display unit having a tactile display and a valve operatively connected to the tactile display for regulating an efflux of the fluid medium from the tactile display into the pipeline.

    Crossing the hands increases illusory self-touch

    Manipulation of hand posture, such as crossing the hands, has frequently been used to study how the body and its immediately surrounding space are represented in the brain. Abundant data show that a crossed-arms posture impairs remapping of tactile stimuli from a somatotopic to an external spatial reference frame and deteriorates performance on several tactile processing tasks. Here we investigated how impaired tactile remapping affects illusory self-touch, induced by the non-visual variant of the rubber hand illusion (RHI) paradigm. In this paradigm, blindfolded participants (Experiment 1) had their hands either uncrossed or crossed over the body midline. The strength of illusory self-touch was measured with questionnaire ratings and proprioceptive drift. Our results showed that, during synchronous tactile stimulation, the strength of illusory self-touch increased when the hands were crossed compared to the uncrossed posture. Follow-up experiments showed that the increase in illusion strength was not related to unfamiliar hand position (Experiment 2) and that the illusion was equally strengthened regardless of where in peripersonal space the hands were crossed (Experiment 3). However, while the boosting effect of crossing the hands was evident from subjective ratings, proprioceptive drift was not modulated by the crossed posture. Finally, in contrast to the illusion increase in the non-visual RHI, crossed hand postures did not alter illusory ownership or proprioceptive drift in the classical, visuo-tactile version of the RHI (Experiment 4). We argue that the increase in illusory self-touch is related to misalignment of the somatotopic and external reference frames and consequently inadequate tactile-proprioceptive integration, leading to a re-weighting of the tactile and proprioceptive signals. The present study not only shows that illusory self-touch can be induced by crossing the hands but, importantly, that this posture is associated with a stronger illusion.